feat: Connect DYN_LOG level to TLLM_LOG_LEVEL #3451
Conversation
Walkthrough
Introduces pre-import configuration that sets TLLM_LOG_LEVEL from DYN_LOG before TensorRT-LLM is imported. Adds mapping and configuration functions to the logging module, integrates the TensorRT-LLM logging setup into the existing configure_dynamo_logging, and respects overrides via already-set environment variables or a skip flag.
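The mapping is easiest to picture as a small lookup table. A minimal sketch, assuming a five-level table (the real function sits at lines 198-222 of logging.py; the exact level names and filter handling below are assumptions, not copied from the diff):

```python
import os

# Assumed DYN_LOG -> TLLM_LOG_LEVEL table; TensorRT-LLM accepts level
# strings such as ERROR, WARNING, INFO, DEBUG, and TRACE.
_DYN_TO_TLLM = {
    "error": "ERROR",
    "warn": "WARNING",
    "info": "INFO",
    "debug": "DEBUG",
    "trace": "TRACE",
}


def map_dyn_log_to_tllm_level(dyn_log: str) -> str:
    """Translate a DYN_LOG value into a TLLM_LOG_LEVEL string."""
    # DYN_LOG can carry per-target filters (e.g. "info,h2=off"); only the
    # leading default level is considered here (an assumption).
    base = dyn_log.split(",", 1)[0].strip().lower()
    return _DYN_TO_TLLM.get(base, "INFO")
```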
Sequence Diagram(s)

sequenceDiagram
autonumber
participant Env as Process Env
participant Main as dynamo/trtllm/main.py
participant L as logging.py
participant TRT as tensorrt_llm
Note over Main: Module import start
Main->>Env: Check TLLM_LOG_LEVEL
alt Not set
Main->>L: map_dyn_log_to_tllm_level(DYN_LOG)
L-->>Main: TLLM level string
Main->>Env: Set TLLM_LOG_LEVEL
else Already set
Note over Main: Preserve existing value
end
Main->>TRT: import tensorrt_llm...
Note over TRT: Initializes with established log level
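The ordering in this first diagram matters because TensorRT-LLM reads TLLM_LOG_LEVEL at import time. A sketch of what the pre-import hook in main.py could look like, assuming this shape (only map_dyn_log_to_tllm_level and the file names come from the diff):

```python
import os

from dynamo.runtime.logging import map_dyn_log_to_tllm_level

# Must run before `import tensorrt_llm`: the library picks up
# TLLM_LOG_LEVEL when it is imported, so setting it later has no effect.
if "TLLM_LOG_LEVEL" not in os.environ:
    os.environ["TLLM_LOG_LEVEL"] = map_dyn_log_to_tllm_level(
        os.environ.get("DYN_LOG", "info")
    )

import tensorrt_llm  # noqa: E402  (deliberately after the env setup)
```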
sequenceDiagram
autonumber
participant App as Application
participant Log as configure_dynamo_logging
participant L as logging.py
participant Env as Process Env
App->>Log: configure_dynamo_logging(...)
alt DYN_SKIP_TRTLLM_LOG_FORMATTING not set
Log->>L: configure_trtllm_logging(dyn_level)
L->>Env: If TLLM_LOG_LEVEL unset, set from map_dyn_log_to_tllm_level(DYN_LOG)
else Skip flag set
Note over Log,Env: Do not modify TLLM_LOG_LEVEL
end
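The second diagram translates to a guard-first function. A sketch of configure_trtllm_logging under the same assumptions; whether the level comes from the dyn_level argument or is re-read from DYN_LOG is an implementation detail (the Ruff note further down hints the argument went unused in the actual diff):

```python
import os

# Reusing the mapping helper sketched above.
from dynamo.runtime.logging import map_dyn_log_to_tllm_level


def configure_trtllm_logging(dyn_level: str) -> None:
    """Derive TLLM_LOG_LEVEL from the Dynamo level unless overridden."""
    # Opt-out knob: leave TensorRT-LLM logging entirely untouched.
    if os.environ.get("DYN_SKIP_TRTLLM_LOG_FORMATTING"):
        return
    # An explicitly exported TLLM_LOG_LEVEL always wins over the derived value.
    os.environ.setdefault(
        "TLLM_LOG_LEVEL", map_dyn_log_to_tllm_level(dyn_level)
    )
```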
Estimated code review effort: 🎯 3 (Moderate) | ⏱️ ~20 minutes
Pre-merge checks
❌ Failed checks (1 warning)
✅ Passed checks (2 passed)
Actionable comments posted: 2
📜 Review details
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (2)
- components/src/dynamo/trtllm/main.py (1 hunks)
- lib/bindings/python/src/dynamo/runtime/logging.py (2 hunks)
🧰 Additional context used
🧬 Code graph analysis (1)
components/src/dynamo/trtllm/main.py (1)
- lib/bindings/python/src/dynamo/runtime/logging.py: map_dyn_log_to_tllm_level (198-222)
🪛 Ruff (0.13.3)
lib/bindings/python/src/dynamo/runtime/logging.py
225-225: Unused function argument: dyn_level (ARG001)
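The ARG001 hit means configure_trtllm_logging accepts dyn_level but never reads it, likely because the level is re-read from DYN_LOG inside the function. Two conventional resolutions, shown only as illustrations:

```python
# Option 1: actually consume the argument, as in the sketch further up.

# Option 2: if the parameter exists only for signature symmetry, rename
# it with a leading underscore so Ruff treats it as intentionally unused.
def configure_trtllm_logging(_dyn_level: str) -> None:
    ...
```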
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (2)
- GitHub Check: trtllm (amd64)
- GitHub Check: Build and Test - dynamo
Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com> Signed-off-by: Tanmay Verma <[email protected]>
Thanks Tanmay! Just needs pre-commit fixed.
Signed-off-by: Tanmay Verma <[email protected]> Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com> Signed-off-by: Piotr Tarasiewicz <[email protected]>
Overview:
This fixes logging within the TRTLLM engine. The code sets TLLM_LOG_LEVEL=INFO by default, which allows the print_iter_log output to appear.
Without this change, TRTLLM logs do not appear during inference.
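A quick way to see the default in action (illustrative only; it reuses the mapping helper sketched earlier, and the exact default is an assumption based on this overview):

```python
import os

# Simulate a fresh shell: neither variable exported.
os.environ.pop("TLLM_LOG_LEVEL", None)
os.environ.pop("DYN_LOG", None)

from dynamo.runtime.logging import map_dyn_log_to_tllm_level

level = map_dyn_log_to_tllm_level(os.environ.get("DYN_LOG", "info"))
print(level)  # expected: "INFO", which lets print_iter_log output through
```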
Before
After